Search results for "Gesture-Based Interaction"

Showing 3 of 3 documents

Extending and validating gestUI using technical action research

2017

gestUI is a model-driven method with tool support to define custom gestures and to include gesture-based interaction in the user interfaces of existing software systems. Until now, gestUI had been limited to defining the same gesture catalogue for all users of a software system. In this paper, we extend gestUI so that individual users can define their own custom gesture catalogue and redefine custom gestures they find difficult to use or remember. After extending gestUI, we applied technical action research using the FP7 CaaS project's Capability Design Tool, with the aim of assessing its acceptance in an industrial setting. We also analysed its perceived ease of use and use…
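The per-user catalogue idea the abstract describes can be sketched as a minimal data structure; the names `Gesture`, `GestureCatalogue`, `define`, and `redefine` are illustrative assumptions, not gestUI's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Gesture:
    name: str
    # Ordered (x, y) sample points of the drawn stroke.
    points: list

@dataclass
class GestureCatalogue:
    """One catalogue per user, instead of a single shared catalogue."""
    user: str
    gestures: dict = field(default_factory=dict)

    def define(self, gesture: Gesture) -> None:
        self.gestures[gesture.name] = gesture

    def redefine(self, name: str, points: list) -> None:
        # Replace a gesture the user finds hard to perform or remember.
        self.gestures[name] = Gesture(name, points)

# Usage: each user owns and can revise their own catalogue.
cat = GestureCatalogue(user="alice")
cat.define(Gesture("undo", [(0, 0), (1, 1)]))
cat.redefine("undo", [(0, 0), (2, 0)])
```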

Keywords: Model-Driven Development; Computer science; Design tool; Industrial setting; Software engineering; Context (language use); Gesture-Based Interaction; Human-Computer Interaction; User Experience; User experience design; Technical Action Research; Artificial intelligence & image processing; Software system; User interface; Action research; Gesture
Published in: 2017 11th International Conference on Research Challenges in Information Science (RCIS)

Real-Time Hand Pose Recognition Based on a Neural Network Using Microsoft Kinect

2013

The Microsoft Kinect sensor is widely used to detect and recognise body gestures and posture with sufficient reliability, accuracy and precision in a fairly simple way. However, the rather low resolution of its optical sensors does not allow the device to detect gestures of smaller body parts, such as the fingers of a hand, with the same ease. Given the clear applications of this technology to user interaction within immersive multimedia environments, there is a real need for a reliable and effective method to detect the pose of certain body parts. In this paper we propose a method based on a neural network to detect the hand pose in real time and to recognise whether it…
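The pipeline the abstract outlines (hand-region features fed to a neural network that classifies the pose) can be sketched as a forward pass through a small network; the feature size, layer widths, and two-class output (e.g. open vs closed hand) are assumptions for illustration, and the random weights stand in for a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: 64 depth features extracted from the hand
# region, one hidden layer of 16 units, two pose classes.
W1, b1 = rng.normal(size=(64, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 2)), np.zeros(2)

def classify_pose(features: np.ndarray) -> int:
    """Return the index of the most likely pose class."""
    hidden = np.tanh(features @ W1 + b1)   # hidden-layer activations
    logits = hidden @ W2 + b2              # one score per pose class
    return int(np.argmax(logits))

# Usage: classify one simulated feature vector per frame.
pose = classify_pose(rng.normal(size=64))
```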

Keywords: Settore ING-INF/05 - Sistemi di Elaborazione delle Informazioni; Artificial neural network; Gesture recognition; Computer science; Microsoft Kinect; Gesture-based interaction; Virtual reality; Object detection; Human-computer interaction; Feature (computer vision); Computer vision; Artificial intelligence; Noise (video); Pose; Gesture
Published in: 2013 Eighth International Conference on Broadband and Wireless Computing, Communication and Applications

An empirical comparative evaluation of gestUI to include gesture-based interaction in user interfaces

2019

Tools currently exist that support the customisation of users' gestures. In general, including new gestures means writing new lines of code that depend strongly on the target platform on which the system runs. To avoid this platform dependency, gestUI was proposed as a model-driven method that permits (i) the definition of custom touch-based gestures, and (ii) the inclusion of gesture-based interaction in existing user interfaces on desktop computing platforms. The objective of this work is to compare gestUI (an MDD method for dealing with gestures) with a code-centric method for including gesture-based interaction in user interfaces. In order to perform the comparis…
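The contrast the abstract draws can be illustrated with a toy example: in a model-driven style, gesture-to-action behaviour lives in a declarative model that a generic dispatcher interprets, whereas a code-centric approach hardwires each gesture into platform-specific handlers. The model keys and action names below are hypothetical, not taken from gestUI.

```python
# Hypothetical declarative gesture model: behaviour is data, so it can
# be edited or retargeted to another platform without rewriting code.
gesture_model = {
    "swipe_left": "previous_page",
    "swipe_right": "next_page",
}

actions_fired = []

def dispatch(gesture: str) -> None:
    """Generic interpreter: look the gesture up in the model."""
    action = gesture_model.get(gesture)
    if action is not None:
        actions_fired.append(action)

# Usage: the same dispatcher serves any catalogue of gestures.
dispatch("swipe_left")
dispatch("unknown_gesture")  # silently ignored: not in the model
```

A code-centric version would instead encode each branch as an `if gesture == "swipe_left": ...` block per platform, which is what the paper's comparison weighs against the model-driven approach.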

Keywords: Code-centric method; Computer science; Model-driven method; Humanities; LENGUAJES Y SISTEMAS INFORMATICOS; Software; Human-computer interaction; Comparative empirical evaluation; Gesture-based interaction